Incremental Learning of Temporally-Coherent Gaussian Mixture Models

Authors

  • Ognjen Arandjelovic
  • Roberto Cipolla
Abstract

In this paper we address the problem of learning Gaussian Mixture Models (GMMs) incrementally. Unlike previous approaches, which universally assume that new data arrives in blocks representable by GMMs that are then merged with the current model estimate, our method handles the case in which novel data points arrive one-by-one, while requiring little additional memory. We keep only two GMMs in memory and no historical data. The current fit is updated under the assumption that the number of components is fixed; this number is increased (or reduced) when enough evidence for a new component is seen. Such evidence is deduced from the change relative to the oldest fit of the same complexity, termed the Historical GMM, a concept central to our method. The performance of the proposed method is demonstrated qualitatively and quantitatively on several synthetic data sets and on video sequences of faces acquired in realistic imaging conditions.
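The fixed-complexity update step described in the abstract can be sketched as incremental EM: each new point softly updates one mixture's running sufficient statistics. This is only a minimal illustrative sketch under assumed conditions (1-D data, known component count); the paper's Historical-GMM bookkeeping and the component birth/death test are omitted, and all names are hypothetical.

```python
import math

def gaussian_pdf(x, mean, var):
    """Density of a 1-D Gaussian at x."""
    return math.exp(-(x - mean) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

class OnlineGMM1D:
    """Illustrative fixed-K mixture updated one point at a time (incremental EM)."""

    def __init__(self, means, variances, prior_count=1.0):
        self.means = list(means)
        self.vars = list(variances)
        # Soft counts per component; the prior count damps early updates.
        self.counts = [prior_count] * len(self.means)

    @property
    def weights(self):
        total = sum(self.counts)
        return [c / total for c in self.counts]

    def update(self, x):
        """Incorporate a single new point (one-by-one arrival)."""
        # E-step: responsibility of each component for x.
        w = self.weights
        resp = [w[k] * gaussian_pdf(x, self.means[k], self.vars[k])
                for k in range(len(w))]
        total = sum(resp) or 1e-300
        resp = [r / total for r in resp]
        # M-step: per-component recursive update of the sufficient statistics.
        for k, r in enumerate(resp):
            self.counts[k] += r
            lr = r / self.counts[k]
            d = x - self.means[k]
            self.means[k] += lr * d
            self.vars[k] = max(self.vars[k] + lr * (d * d - self.vars[k]), 1e-6)

# Two well-separated components; points arrive one-by-one.
gmm = OnlineGMM1D(means=[0.0, 5.0], variances=[1.0, 1.0])
for x in [0.1, 4.9, -0.2, 5.3, 0.05, 5.1]:
    gmm.update(x)
```

Note that only the current statistics are stored: memory is constant in the number of points seen, which is the property the paper's setting requires.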

Similar Articles

Incremental Learning of Multivariate Gaussian Mixture Models

This paper presents a new algorithm for unsupervised incremental learning based on a Bayesian framework. The algorithm, called IGMM (for Incremental Gaussian Mixture Model), creates and continually adjusts a Gaussian Mixture Model consistent with all sequentially presented data. IGMM is particularly useful for on-line incremental clustering of data streams, as encountered in the domain of mobile ...

Scalable and Incremental Learning of Gaussian Mixture Models

This work presents a fast and scalable algorithm for incremental learning of Gaussian mixture models. By performing rank-one updates on its precision matrices and determinants, its asymptotic time complexity is O(NKD) for N data points, K Gaussian components and D dimensions. The resulting algorithm can be applied to high dimensional tasks, and this is confirmed by applying it to the clas...
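The rank-one trick mentioned above can be illustrated with the Sherman-Morrison identity and the matrix determinant lemma: if a covariance matrix changes by a rank-one term, its inverse (precision) and determinant can be refreshed without a full O(D^3) re-inversion. This is a hedged sketch of the general linear-algebra idea, not the cited paper's code; function and variable names are assumptions.

```python
def rank_one_update(P, det_S, u, c):
    """Given P = inv(S) and det(S), return the precision and determinant of
    S' = S + c * u u^T, using Sherman-Morrison and the determinant lemma:
        P' = P - c * (P u)(P u)^T / (1 + c * u^T P u)
        det(S') = det(S) * (1 + c * u^T P u)
    Costs O(D^2) per update instead of O(D^3) for re-inversion."""
    d = len(u)
    Pu = [sum(P[i][j] * u[j] for j in range(d)) for i in range(d)]
    denom = 1.0 + c * sum(u[i] * Pu[i] for i in range(d))
    P_new = [[P[i][j] - c * Pu[i] * Pu[j] / denom for j in range(d)]
             for i in range(d)]
    return P_new, det_S * denom

# Start from S = I in 2-D (so P = I, det(S) = 1) and add u u^T with u = (1, 0):
# S' = diag(2, 1), hence det(S') = 2 and the new precision has P'[0][0] = 0.5.
P, det_S = rank_one_update([[1.0, 0.0], [0.0, 1.0]], 1.0, [1.0, 0.0], 1.0)
```

In an incremental GMM, u would be the deviation of the new point from a component mean, which is what makes per-point covariance maintenance cheap.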

Incremental learning with Gaussian mixture models

In this paper we propose a new incremental estimation method for Gaussian mixture models that can be used in online learning applications. Our approach allows for adding new samples incrementally as well as removing parts of the mixture by the process of unlearning. Low complexity of the mixtures is maintained through a novel compression algorithm. In contrast to the existing approaches, our appro...

IGMN: An Incremental Gaussian Mixture Network that Learns Instantaneously from Data Flows

This work proposes IGMN (standing for Incremental Gaussian Mixture Network), a new connectionist approach for incremental concept formation and robotic tasks. It is inspired by recent theories about the brain, especially the Memory-Prediction Framework and Constructivist Artificial Intelligence, which endow it with some unique features that are not present in most ANN models such as MLP an...

Incremental Learning of Gaussian Mixture Models

Gaussian Mixture Modeling (GMM) is a parametric method for high dimensional density estimation. Incremental learning of GMM is very important in problems such as clustering of streaming data and robot localization in dynamic environments. Traditional GMM estimation algorithms like EM Clustering tend to be computationally very intensive in these scenarios. We present an incremental GMM estimatio...

Journal title:

Volume   Issue

Pages  -

Publication date: 2005